Manifold Optimization-Assisted Gaussian Variational Approximation

Authors

Abstract

Gaussian variational approximation is a popular methodology for approximating posterior distributions in Bayesian inference, especially in high-dimensional and large-data settings. To control the computational cost while still being able to capture correlations among variables, a low-rank-plus-diagonal structure for the covariance matrix has been introduced in previous literature. For a specific learning task, uniqueness of the solution is usually ensured by imposing stringent constraints on the parameterized matrix, which could break down during the optimization process. In this article, we consider two special structures that apply Stiefel-manifold and Grassmann-manifold constraints to address this difficulty in such factorization architectures. To speed up the updating process with minimal hyperparameter-tuning effort, we design new schemes of Riemannian stochastic gradient descent methods and compare them with other existing methods for optimization on manifolds. In addition to fixing the identification issue, results from both simulation and empirical experiments demonstrate the ability of the proposed methods to obtain competitive accuracy and comparable convergence in large-scale tasks. Supplementary materials for this article are available online.
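The core device in the abstract, a low-rank-plus-diagonal covariance Sigma = B B^T + D^2 with the factor B kept identifiable via a Stiefel-manifold constraint, can be sketched with one Riemannian gradient-descent step: project the Euclidean gradient onto the tangent space, step, and retract. This is a minimal illustration only; the toy objective, step size, and helper names are assumptions, not the paper's algorithm.

```python
# Minimal sketch of Riemannian gradient descent on the Stiefel manifold
# St(N, p) = {B : B^T B = I_p}, the constraint set for the low-rank factor B
# in Sigma = B B^T + D^2. Toy objective and hyperparameters are illustrative.
import numpy as np

def tangent_project(B, G):
    """Project a Euclidean gradient G onto the tangent space of St(N, p) at B."""
    BtG = B.T @ G
    return G - B @ (0.5 * (BtG + BtG.T))

def qr_retract(X):
    """Map a tangent-space step back onto the manifold via a thin QR."""
    Q, R = np.linalg.qr(X)
    return Q * np.sign(np.diag(R))  # fix column signs so diag(R) > 0

rng = np.random.default_rng(0)
N, p = 50, 3
B = qr_retract(rng.standard_normal((N, p)))   # initial point on St(N, p)

# Toy objective: minimize -tr(B^T A B), i.e. align span(B) with A's top eigenspace.
A = rng.standard_normal((N, N))
A = A @ A.T
for step in range(200):
    G = -2.0 * A @ B                # Euclidean gradient of -tr(B^T A B)
    xi = tangent_project(B, G)      # Riemannian gradient
    B = qr_retract(B - 0.01 * xi)   # descent step + retraction

# B remains exactly orthonormal throughout, so no post-hoc identification
# constraint on the factorization is needed.
assert np.allclose(B.T @ B, np.eye(p), atol=1e-8)
```

The retraction is what distinguishes this from plain SGD with a penalty: feasibility (B^T B = I) holds exactly at every iterate rather than approximately.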


Related Articles

The Variational Gaussian Approximation Revisited

The variational approximation of posterior distributions by multivariate Gaussians has been much less popular in the machine learning community compared to the corresponding approximation by factorizing distributions. This is for a good reason: the Gaussian approximation is in general plagued by an O(N^2) number of variational parameters to be optimized, N being the number of random vari...


Variational Gaussian approximation for Poisson data

The Poisson model is frequently employed to describe count data, but in a Bayesian context it leads to an analytically intractable posterior probability distribution. In this work, we analyze a variational Gaussian approximation to the posterior distribution arising from the Poisson model with a Gaussian prior. This is achieved by seeking an optimal Gaussian distribution minimizing the Kullback...
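For the Poisson model with a Gaussian prior described above, the variational Gaussian approximation is tractable in one dimension because the expected log-likelihood has a closed form: with q = N(m, s^2), E_q[y x - exp(x)] = y m - exp(m + s^2/2). The sketch below maximizes the resulting ELBO by gradient ascent; the scalar setup, data value, and learning rate are illustrative assumptions, not the cited paper's formulation.

```python
# Variational Gaussian approximation q = N(m, s^2) for the 1-D model
# x ~ N(0, 1), y | x ~ Poisson(exp(x)). All constants here are illustrative.
import math

def elbo(m, s, y):
    """ELBO up to additive constants: E_q[log p(y|x) p(x)] + entropy of q."""
    return y * m - math.exp(m + s * s / 2) - (m * m + s * s) / 2 + math.log(s)

y = 4.0          # a single observed count
m, s = 0.0, 1.0  # initialize q at the prior
lr = 0.05
for _ in range(2000):
    e = math.exp(m + s * s / 2)     # E_q[exp(x)]
    grad_m = y - e - m              # d ELBO / dm
    grad_s = -s * e - s + 1.0 / s   # d ELBO / ds
    m += lr * grad_m
    s += lr * grad_s

# At convergence the stationarity condition y - exp(m + s^2/2) - m = 0 holds,
# and s^2 = 1 / (y - m + 1) shrinks below the prior variance, as expected.
```

In higher dimensions the same KL-minimization has no closed-form expectation in general, which is what motivates the structured covariances and manifold methods in the main article.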


Manifold Optimization for Gaussian Mixture Models

We take a new look at parameter estimation for Gaussian Mixture Models (GMMs). In particular, we propose using Riemannian manifold optimization as a powerful counterpart to Expectation Maximization (EM). An out-of-the-box invocation of manifold optimization, however, fails spectacularly: it converges to the same solution but vastly slower. Driven by intuition from manifold convexity, we then pr...


Matrix Manifold Optimization for Gaussian Mixtures

We take a new look at parameter estimation for Gaussian Mixture Models (GMMs). Specifically, we advance Riemannian manifold optimization (on the manifold of positive definite matrices) as a potential replacement for Expectation Maximization (EM), which has been the de facto standard for decades. An out-of-the-box invocation of Riemannian optimization, however, fails spectacularly: it obtains the...


Optimization by Gaussian Processes assisted Evolution Strategies

Evolutionary Algorithms (EA) are excellent optimization tools for complex high-dimensional multimodal problems. However, they require a very large number of problem function evaluations. In many engineering optimization problems, like high throughput material science or design optimization, a single fitness evaluation is very expensive or time consuming. Therefore, standard evolutionary computa...



Journal

Journal: Journal of Computational and Graphical Statistics

Year: 2021

ISSN: 1061-8600, 1537-2715

DOI: https://doi.org/10.1080/10618600.2021.1923516